Group 16 - Assignment 2¶
| Name | Email | Contribution |
|---|---|---|
| BALAKRISHNAN V S | 2024aa05017@wilp.bits-pilani.ac.in | 100% |
| JAISRI S | 2024aa05138@wilp.bits-pilani.ac.in | 100% |
| AKHILESH KUMAR SHRIVASTAVA | 2024aa05860@wilp.bits-pilani.ac.in | 100% |
Credit Card Approval Prediction using MLP with LIME and SHAP Explanations¶
Task 1: Load the dataset and perform exploratory data analysis via appropriate visualization. Normalize the features as appropriate¶
%pip install lime shap
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split, cross_val_score, KFold
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.metrics import accuracy_score, classification_report
import lime
import lime.lime_tabular
import shap
from sklearn.utils import resample
import random
from IPython.display import display
import IPython
# Set random seed for reproducibility
np.random.seed(42)
random.seed(42)
# Load the dataset
data = pd.read_csv('UniversalBank.csv')
# Display basic information
print("Dataset shape:", data.shape)
print("\nFirst 5 rows:")
display(data.head())
Dataset shape: (5000, 14) First 5 rows:
| | ID | Age | Experience | Income | ZIP Code | ... | Personal Loan | Securities Account | CD Account | Online | CreditCard |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 25 | 1 | 49 | 91107 | ... | 0 | 1 | 0 | 0 | 0 |
| 1 | 2 | 45 | 19 | 34 | 90089 | ... | 0 | 1 | 0 | 0 | 0 |
| 2 | 3 | 39 | 15 | 11 | 94720 | ... | 0 | 0 | 0 | 0 | 0 |
| 3 | 4 | 35 | 9 | 100 | 94112 | ... | 0 | 0 | 0 | 0 | 0 |
| 4 | 5 | 35 | 8 | 45 | 91330 | ... | 0 | 0 | 0 | 0 | 1 |
5 rows × 14 columns
# Basic statistics
print("\nDescriptive statistics:")
display(data.describe())
# Check for missing values
print("\nMissing values:")
print(data.isnull().sum())
Descriptive statistics:
| | ID | Age | Experience | Income | ZIP Code | ... | Personal Loan | Securities Account | CD Account | Online | CreditCard |
|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 5000.000000 | 5000.000000 | 5000.000000 | 5000.000000 | 5000.000000 | ... | 5000.000000 | 5000.000000 | 5000.00000 | 5000.000000 | 5000.000000 |
| mean | 2500.500000 | 45.338400 | 20.104600 | 73.774200 | 93152.503000 | ... | 0.096000 | 0.104400 | 0.06040 | 0.596800 | 0.294000 |
| std | 1443.520003 | 11.463166 | 11.467954 | 46.033729 | 2121.852197 | ... | 0.294621 | 0.305809 | 0.23825 | 0.490589 | 0.455637 |
| min | 1.000000 | 23.000000 | -3.000000 | 8.000000 | 9307.000000 | ... | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| 25% | 1250.750000 | 35.000000 | 10.000000 | 39.000000 | 91911.000000 | ... | 0.000000 | 0.000000 | 0.00000 | 0.000000 | 0.000000 |
| 50% | 2500.500000 | 45.000000 | 20.000000 | 64.000000 | 93437.000000 | ... | 0.000000 | 0.000000 | 0.00000 | 1.000000 | 0.000000 |
| 75% | 3750.250000 | 55.000000 | 30.000000 | 98.000000 | 94608.000000 | ... | 0.000000 | 0.000000 | 0.00000 | 1.000000 | 1.000000 |
| max | 5000.000000 | 67.000000 | 43.000000 | 224.000000 | 96651.000000 | ... | 1.000000 | 1.000000 | 1.00000 | 1.000000 | 1.000000 |
8 rows × 14 columns
Missing values:
ID 0
Age 0
Experience 0
Income 0
ZIP Code 0
..
Personal Loan 0
Securities Account 0
CD Account 0
Online 0
CreditCard 0
Length: 14, dtype: int64
# Visualize the distribution of numerical features
numerical_cols = ['Age', 'Experience', 'Income', 'Family', 'CCAvg', 'Mortgage']
data[numerical_cols].hist(bins=20, figsize=(15, 10))
plt.tight_layout()
plt.show()
# Correlation matrix
plt.figure(figsize=(12, 8))
corr_matrix = data[numerical_cols + ['CreditCard']].corr()
sns.heatmap(corr_matrix, annot=True, cmap='coolwarm', center=0)
plt.title('Correlation Matrix')
plt.show()
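One more EDA check worth making before modeling is the balance of the target itself: the descriptive statistics above show a mean of 0.294 for CreditCard, i.e. roughly 29% positives. A minimal sketch of the check, using a synthetic stand-in series with that same 29/71 split so it runs without the CSV:

```python
import pandas as pd

# Synthetic stand-in for data['CreditCard'] (~29% positives, matching the summary stats)
y = pd.Series([1] * 1470 + [0] * 3530, name='CreditCard')

# Proportion of each class
balance = y.value_counts(normalize=True)
print(balance)

# A majority-class baseline would already achieve this accuracy,
# a useful reference point for the MLP's CV accuracy later
baseline_acc = balance.max()
print(f"Majority-class baseline accuracy: {baseline_acc:.3f}")
```

Comparing model accuracy against this baseline guards against mistaking class imbalance for predictive skill.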
# Prepare data for modeling
# Drop ID and ZIP Code as they are not useful for prediction
data = data.drop(['ID', 'ZIP Code'], axis=1)
# Split into features and target
X = data.drop('CreditCard', axis=1)
y = data['CreditCard']
# Normalize features
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)
# Hold out a test set (the 5-fold cross-validation below runs on the full scaled dataset)
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, test_size=0.2, random_state=42)
print("Training set shape:", X_train.shape)
print("Test set shape:", X_test.shape)
Training set shape: (4000, 11) Test set shape: (1000, 11)
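StandardScaler standardizes each column to z = (x − μ)/σ, so every feature ends up with zero mean and unit variance. A quick sanity check of that property, on a synthetic matrix rather than the bank data so it is self-contained:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X_demo = rng.normal(loc=50.0, scale=12.0, size=(1000, 3))  # synthetic stand-in features

scaler = StandardScaler()
X_demo_scaled = scaler.fit_transform(X_demo)

# Each column should now have mean ~0 and standard deviation ~1
print(X_demo_scaled.mean(axis=0))
print(X_demo_scaled.std(axis=0))
```

Standardization matters for MLPs in particular, since features on very different scales (e.g. Income vs. binary indicators) would otherwise dominate the gradient updates.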
Task 2: Using 5 fold cross-validation, implement a multilayer perceptron with no more than 2 hidden layers. Report the training error and cross-validation error.¶
# Initialize MLP classifier
mlp = MLPClassifier(hidden_layer_sizes=(50, 30), max_iter=1000, random_state=42)
# Perform 5-fold cross-validation
kfold = KFold(n_splits=5, shuffle=True, random_state=42)
cv_scores = cross_val_score(mlp, X_scaled, y, cv=kfold, scoring='accuracy')
# Train the model on full training set
mlp.fit(X_train, y_train)
# Calculate training error
train_pred = mlp.predict(X_train)
train_error = 1 - accuracy_score(y_train, train_pred)
print("Cross-validation scores:", cv_scores)
print("Mean CV accuracy: {:.4f}".format(cv_scores.mean()))
print("Training error: {:.4f}".format(train_error))
Cross-validation scores: [0.681 0.682 0.709 0.678 0.689] Mean CV accuracy: 0.6878 Training error: 0.2057
# Initialize MLP classifier
mlp = MLPClassifier(hidden_layer_sizes=(80, 30), max_iter=1000, random_state=42)
# Perform 5-fold cross-validation
kfold = KFold(n_splits=5, shuffle=True, random_state=42)
cv_scores = cross_val_score(mlp, X_scaled, y, cv=kfold, scoring='accuracy')
# Train the model on full training set
mlp.fit(X_train, y_train)
# Calculate training error
train_pred = mlp.predict(X_train)
train_error = 1 - accuracy_score(y_train, train_pred)
print("Cross-validation scores:", cv_scores)
print("Mean CV accuracy: {:.4f}".format(cv_scores.mean()))
print("Training error: {:.4f}".format(train_error))
Cross-validation scores: [0.696 0.682 0.7 0.648 0.701] Mean CV accuracy: 0.6854 Training error: 0.1913
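The cells above report the training error from a single refit on X_train, separately from the CV scores. An alternative, which keeps the two errors on the same folds, is scikit-learn's `cross_validate` with `return_train_score=True`; a sketch on synthetic data (the hidden-layer sizes mirror the first model above, other settings are illustrative):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, cross_validate
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the scaled feature matrix (11 features, as in the bank data)
X_demo, y_demo = make_classification(n_samples=500, n_features=11, random_state=42)

mlp_demo = MLPClassifier(hidden_layer_sizes=(50, 30), max_iter=300, random_state=42)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

results = cross_validate(mlp_demo, X_demo, y_demo, cv=kfold,
                         scoring='accuracy', return_train_score=True)

train_error = 1 - results['train_score'].mean()  # averaged over the 5 folds
cv_error = 1 - results['test_score'].mean()
print(f"Mean training error: {train_error:.4f}")
print(f"Mean CV error:       {cv_error:.4f}")
```

A large gap between the two errors on the same folds is a cleaner overfitting signal than comparing a refit training error against separately computed CV scores.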
Task 3: Randomly select 5 data points. Apply LIME to explain the individual outcome predicted by the MLP. Then implement submodular pick and derive a LIME explanation for 10% of training data points with no more than 10 explanations. Using these explanations, predict whether credit card is approved or not using the entire training data and calculate the classification error.¶
# Randomly select 5 data points
np.random.seed(42)
sample_indices = np.random.choice(X_train.shape[0], 5, replace=False)
samples = X_train[sample_indices]
sample_labels = y_train.iloc[sample_indices]
# Initialize LIME explainer
explainer = lime.lime_tabular.LimeTabularExplainer(
    X_train,
    feature_names=list(X.columns),
    class_names=['No Credit Card', 'Credit Card'],
    verbose=True,
    mode='classification'
)
# Explain predictions for the 5 samples
for i, (sample, label) in enumerate(zip(samples, sample_labels)):
    print(f"\nExplanation for sample {i+1} (True label: {label})")
    exp = explainer.explain_instance(sample, mlp.predict_proba, num_features=5)
    # Display the HTML representation for the first two samples only
    if i == 0 or i == 1:
        display(IPython.display.HTML(exp.as_html()))
Explanation for sample 1 (True label: 1) Intercept 0.5037072238668607 Prediction_local [0.29272586] Right: 0.5894394997076419
Explanation for sample 2 (True label: 0) Intercept 0.5298145577238602 Prediction_local [0.13285708] Right: 0.08017557472590192
Explanation for sample 3 (True label: 1) Intercept 0.025622148627299612 Prediction_local [0.77243549] Right: 0.9999486349787456
Explanation for sample 4 (True label: 0) Intercept 0.513424315902519 Prediction_local [0.24814297] Right: 0.11957885569948563
Explanation for sample 5 (True label: 1) Intercept 0.489680679875444 Prediction_local [0.31195429] Right: 0.2516967498532311
# Implement a simplified Submodular Pick for LIME explanations
def submodular_pick(X, explainer, model, num_explanations=10, sample_fraction=0.1):
    # Sample 10% of the data
    n_samples = int(X.shape[0] * sample_fraction)
    sample_indices = np.random.choice(X.shape[0], n_samples, replace=False)
    X_samples = X[sample_indices]

    # Generate a LIME explanation for every sampled point
    explanations = []
    for sample in X_samples:
        exp = explainer.explain_instance(sample, model.predict_proba, num_features=5)
        explanations.append(exp)

    # Simplification: keep the first 'num_explanations' explanations.
    # A full SP-LIME implementation would instead greedily maximize
    # weighted feature coverage to pick a diverse set.
    selected_explanations = explanations[:num_explanations]
    selected_indices = sample_indices[:num_explanations]

    return selected_explanations, selected_indices
# Get submodular pick explanations
sp_explanations, sp_indices = submodular_pick(X_train, explainer, mlp)
(Verbose LIME output for the 400 sampled explanations — per-sample Intercept, Prediction_local and Right values — truncated.)
[0.3159436] Right: 0.1885870531763859 Intercept 0.012961567023845655 Prediction_local [0.76706627] Right: 0.9941478398015429 Intercept 0.5623958999556784 Prediction_local [0.2159648] Right: 0.10863549903835017 Intercept 0.4718498197154432 Prediction_local [0.31597389] Right: 0.7252306703100575 Intercept 0.48215492841048346 Prediction_local [0.29579013] Right: 0.07794572684713803 Intercept 0.4135392864513416 Prediction_local [0.46450368] Right: 0.13040751929470465 Intercept 0.5050018200919306 Prediction_local [0.16239418] Right: 0.03927293105259758 Intercept 0.6666825790081228 Prediction_local [0.2440527] Right: 0.00022474743614415513 Intercept 0.7572927228724413 Prediction_local [0.05155025] Right: 0.09102598609312956 Intercept 0.7244869655327467 Prediction_local [0.039389] Right: 2.5633074341906986e-05 Intercept 0.43539926246595717 Prediction_local [0.40786494] Right: 0.052023161168488326 Intercept 0.49010089401646695 Prediction_local [0.27755981] Right: 0.2421700942585849 Intercept 0.49324531803088223 Prediction_local [0.29650598] Right: 0.08143641060437179 Intercept 0.5222986726467403 Prediction_local [0.27897203] Right: 0.39766305309715555 Intercept 0.4875245672798406 Prediction_local [0.30355727] Right: 0.27397528110966424 Intercept 0.5039684459657268 Prediction_local [0.30079419] Right: 0.22298904808996708 Intercept 0.5056294132359612 Prediction_local [0.28143634] Right: 0.7048970974588362 Intercept 0.7416871498818792 Prediction_local [0.04011031] Right: 7.214290485203828e-06 Intercept 0.5221889246974842 Prediction_local [0.17286247] Right: 0.42963152299892826 Intercept 0.2689668099449378 Prediction_local [0.5264673] Right: 0.8260594411786447 Intercept 0.4889452773245129 Prediction_local [0.31692756] Right: 0.1596020365076357 Intercept 0.5008100302288305 Prediction_local [0.26621445] Right: 0.2979771639473043 Intercept 0.4687892754150004 Prediction_local [0.31935802] Right: 0.41999703712863895 Intercept 0.7350774428327898 Prediction_local [0.05425761] Right: 
0.007944349240350732 Intercept 0.5141434420830793 Prediction_local [0.2655164] Right: 0.42077302661952765 Intercept 0.4747633042603746 Prediction_local [0.30809406] Right: 0.0696251998523114 Intercept 0.5171934231429042 Prediction_local [0.29867712] Right: 0.4694490375588956 Intercept 0.7189832529376541 Prediction_local [0.04909146] Right: 1.0138320036729485e-05 Intercept 0.5346334651694731 Prediction_local [0.25086554] Right: 0.1689169481930614 Intercept 0.9388462125086634 Prediction_local [-0.01428372] Right: 5.101987067818338e-05 Intercept 0.4747433941355419 Prediction_local [0.37886519] Right: 0.19985211518818055 Intercept 0.5409130486237435 Prediction_local [0.16349121] Right: 0.5220409523897878 Intercept 0.47174754781498984 Prediction_local [0.32512045] Right: 0.08538502460881633 Intercept 0.5438709178826977 Prediction_local [0.15282803] Right: 0.4341103364788955 Intercept 0.4936061165392819 Prediction_local [0.31123393] Right: 0.10542139225124811 Intercept 0.5156608798903398 Prediction_local [0.27821858] Right: 0.19491074334781155 Intercept 0.563633792612269 Prediction_local [0.11312101] Right: 0.2504973016177744 Intercept 0.49580841510313134 Prediction_local [0.31927577] Right: 0.3716040428225577 Intercept 0.4803406279757086 Prediction_local [0.28468549] Right: 0.276129614245941 Intercept 0.8157975231007047 Prediction_local [-0.11985381] Right: 0.024848635763862852 Intercept 0.5420777021752797 Prediction_local [0.18564872] Right: 0.28403202003211125 Intercept 0.5321495207836482 Prediction_local [0.19902406] Right: 0.16588298274261434 Intercept 0.5471006476082709 Prediction_local [0.18462379] Right: 0.31977515382166094 Intercept 0.4209329109842053 Prediction_local [0.49048891] Right: 0.049600556808841414 Intercept 0.4937047530037828 Prediction_local [0.28566753] Right: 0.0282069551647402 Intercept 0.4717195591440709 Prediction_local [0.2949366] Right: 0.16761809956243273 Intercept 0.48820145522838865 Prediction_local [0.31445847] Right: 0.21501112593412375 
Intercept 0.7578025464483045 Prediction_local [0.04198452] Right: 0.28578420170138885 Intercept 0.46892384992547953 Prediction_local [0.35628966] Right: 0.8071567939969101 Intercept 0.5019408130345097 Prediction_local [0.280265] Right: 0.25514778606015365 Intercept 0.771150754924196 Prediction_local [0.03439168] Right: 0.8225425296966937 Intercept 0.49437348846284823 Prediction_local [0.30190238] Right: 0.21781751288276838 Intercept 0.5357493171557863 Prediction_local [0.18377067] Right: 0.49776711898336456 Intercept 0.7270196963257447 Prediction_local [0.05511704] Right: 0.3224453448850742 Intercept 0.5238477415844919 Prediction_local [0.20746206] Right: 0.11414219069610883 Intercept 0.4511296215999457 Prediction_local [0.40611062] Right: 0.1921865610931952 Intercept 0.47600711407763374 Prediction_local [0.39198462] Right: 0.3135885771156538 Intercept 0.44830338006393194 Prediction_local [0.39118521] Right: 0.22895953431922667 Intercept 0.4879859445686605 Prediction_local [0.28967444] Right: 0.23375543615234884 Intercept 0.47357111963605797 Prediction_local [0.29786741] Right: 0.21715861826314548 Intercept 0.2240733677536604 Prediction_local [0.65424666] Right: 0.4055196271500787 Intercept 0.5019804965653933 Prediction_local [0.28208707] Right: 0.09356773773755295 Intercept 0.5839953329703748 Prediction_local [0.11506205] Right: 0.3836263438810028 Intercept 0.7467763519969635 Prediction_local [0.01345947] Right: 0.017926728759713902 Intercept 0.5068367304543206 Prediction_local [0.28491973] Right: 0.1921661737341335 Intercept 0.4804312749741615 Prediction_local [0.27301165] Right: 0.09420284268358113 Intercept 0.5739943976390842 Prediction_local [0.11727219] Right: 0.019250019087310458 Intercept 0.7488251173731375 Prediction_local [0.05231259] Right: 2.653876931264808e-07 Intercept 0.7497916992646559 Prediction_local [0.04005079] Right: 8.889167666386661e-05 Intercept 0.501706567159026 Prediction_local [0.279446] Right: 0.31342602346134113 Intercept 
0.5433427522357055 Prediction_local [0.16596359] Right: 0.06436999056247111 Intercept 0.40456328973693817 Prediction_local [0.49568559] Right: 0.20077978116790618 Intercept 0.49377954998160367 Prediction_local [0.27957817] Right: 0.3007409291951833 Intercept 0.47893933117526327 Prediction_local [0.32417688] Right: 0.7072374006036491 Intercept 0.4938344469630962 Prediction_local [0.30647055] Right: 0.37953063347808896 Intercept 0.7269728635337912 Prediction_local [0.04814225] Right: 0.10668710329608942 Intercept 0.024292691093911523 Prediction_local [0.74781348] Right: 0.6031792494893772 Intercept 0.46076496749393175 Prediction_local [0.38029314] Right: 0.06315902337219516 Intercept 0.5646897109038921 Prediction_local [0.11880602] Right: 0.8075357047860946 Intercept 0.4860642824637367 Prediction_local [0.2937394] Right: 0.3565904884482609 Intercept 0.45710732437260454 Prediction_local [0.36345892] Right: 0.03365931210220547 Intercept 0.5447795652757315 Prediction_local [0.12631477] Right: 0.002972258358412104 Intercept 0.4586929622987569 Prediction_local [0.31680262] Right: 0.48503549077813113 Intercept 0.4905500274308067 Prediction_local [0.28692849] Right: 0.4701754671651484 Intercept 0.5153947575162661 Prediction_local [0.2883206] Right: 0.36399449570621595 Intercept -0.23045853535573302 Prediction_local [1.03387234] Right: 0.9999999976371226 Intercept 0.5351984662641023 Prediction_local [0.1492441] Right: 0.05883718469302894 Intercept 0.7587253704105748 Prediction_local [0.06075584] Right: 0.9495516285538124 Intercept 0.752749684018656 Prediction_local [0.05646583] Right: 1.010066916136682e-05 Intercept 0.5643903891548254 Prediction_local [0.11243446] Right: 0.011029705873183209 Intercept 0.4691753788646992 Prediction_local [0.38787337] Right: 0.255802876383581 Intercept 0.7624521258184421 Prediction_local [0.05967733] Right: 1.6334807060469813e-06 Intercept 0.518335190605919 Prediction_local [0.20062326] Right: 0.25931353696833886 Intercept 0.4579895729204765 
Prediction_local [0.40316663] Right: 0.18194398441148196 Intercept 0.4360200433036743 Prediction_local [0.50388744] Right: 0.12922769562860884 Intercept 0.5107606117599849 Prediction_local [0.28942314] Right: 0.17535000166861106 Intercept 0.5001383063031265 Prediction_local [0.29220976] Right: 0.6951253500208994 Intercept 0.48006729161147116 Prediction_local [0.29850426] Right: 0.4921984679135414 Intercept 0.5090065413280259 Prediction_local [0.18915485] Right: 0.09281977967477627 Intercept 0.525696455539006 Prediction_local [0.25406273] Right: 0.26589933805161486 Intercept 0.5307710289405311 Prediction_local [0.2091022] Right: 0.05282473363287913 Intercept 0.4770591188479838 Prediction_local [0.31292413] Right: 0.07698085549686474 Intercept 0.5067633580757123 Prediction_local [0.30595798] Right: 0.4354764787604387 Intercept 0.5085163468787252 Prediction_local [0.2721254] Right: 0.2516915007104331 Intercept 0.48640796106745116 Prediction_local [0.28408507] Right: 0.11752408887981936 Intercept 0.5002360211090606 Prediction_local [0.27707149] Right: 0.17680234040807827 Intercept 0.46734372856486606 Prediction_local [0.30039145] Right: 0.2784599707568106 Intercept 0.442863675272263 Prediction_local [0.48394466] Right: 0.16157745695949613 Intercept 0.4753908582132625 Prediction_local [0.29887854] Right: 0.33028972315861116 Intercept 0.4561867885037078 Prediction_local [0.40602926] Right: 0.21152417764037937 Intercept 0.42466531078165615 Prediction_local [0.4729208] Right: 0.1451664729979968 Intercept 0.48339823096260526 Prediction_local [0.31994405] Right: 0.20806795791334728 Intercept 0.41703984574069636 Prediction_local [0.50140074] Right: 0.14002187692262788 Intercept 0.43533552941594084 Prediction_local [0.48705171] Right: 0.03030771662382068 Intercept 0.543802486954579 Prediction_local [0.16396557] Right: 0.33818059976397025 Intercept 0.4855702537821296 Prediction_local [0.28641283] Right: 0.18789214271790883 Intercept 0.5391505718520833 Prediction_local 
[0.22091228] Right: 0.10984131971017491 Intercept 0.5128382272340948 Prediction_local [0.27772748] Right: 0.39334035571502785 Intercept 0.5932711873657941 Prediction_local [0.11017745] Right: 0.05979114257851646 Intercept 0.7499511425890654 Prediction_local [0.0511521] Right: 0.001688858679794652 Intercept 0.7362814241601687 Prediction_local [0.04541868] Right: 3.102393341546575e-05 Intercept 0.47694833087403105 Prediction_local [0.37524302] Right: 0.29209781786140293
# Display the selected explanations
for i, exp in enumerate(sp_explanations):
    print(f"\nSubmodular Pick Explanation {i+1} (Sample index: {sp_indices[i]})")
    # Render only the first two HTML representations; otherwise the notebook
    # exceeds the 10 MB size limit imposed by the Taxila portal
    if i in (0, 1):
        display(IPython.display.HTML(exp.as_html()))
Submodular Pick Explanation 1 (Sample index: 968)
Submodular Pick Explanation 2 (Sample index: 2906)
Submodular Pick Explanation 3 (Sample index: 187)
Submodular Pick Explanation 4 (Sample index: 1668)
Submodular Pick Explanation 5 (Sample index: 1495)
Submodular Pick Explanation 6 (Sample index: 1299)
Submodular Pick Explanation 7 (Sample index: 2476)
Submodular Pick Explanation 8 (Sample index: 960)
Submodular Pick Explanation 9 (Sample index: 447)
Submodular Pick Explanation 10 (Sample index: 569)
Task 4: For the same 5 points selected in Task 3, apply SHAP to explain the same outcomes.¶
# Create a SHAP explainer for the MLP model
# KernelExplainer is model-agnostic and works with any ML model
shap_explainer = shap.KernelExplainer(mlp.predict_proba, shap.sample(X_train, 100))

# Calculate SHAP values for the same 5 samples from Task 3
shap_values = shap_explainer.shap_values(samples)

# Display SHAP explanations for each sample
print("SHAP Explanations for the 5 selected samples:")
for i, (sample, label) in enumerate(zip(samples, sample_labels)):
    print(f"\nSample {i+1} (True label: {label})")

    # Plot only the first sample as a figure to keep the notebook size manageable
    if i == 0:
        # Instead of using force_plot directly, plot the SHAP values with
        # matplotlib; this avoids force_plot's dimension-matching issues
        plt.figure(figsize=(12, 4))
        feature_names = list(X.columns)
        sorted_indices = np.argsort(np.abs(shap_values[1][i]))
        plt.barh(
            [feature_names[j] for j in sorted_indices],
            [shap_values[1][i][j] for j in sorted_indices]
        )
        plt.title(f"SHAP Values for Sample {i+1}")
        plt.xlabel("SHAP Value (Impact on Prediction)")
        plt.tight_layout()
        plt.show()

        # Print the expected value (base rate) for reference
        print(f"Base value (average model output): {shap_explainer.expected_value[1]:.4f}")

    # Calculate and print feature importance based on absolute SHAP values
    importance = np.abs(shap_values[1][i])
    feature_importance = list(zip(X.columns, importance))
    sorted_importance = sorted(feature_importance, key=lambda x: x[1], reverse=True)
    print("Top 5 important features based on SHAP values:")
    for feature, imp in sorted_importance[:5]:
        print(f"{feature}: {imp:.4f}")
100%|██████████| 5/5 [00:00<00:00, 13.86it/s]
SHAP Explanations for the 5 selected samples:

Sample 1 (True label: 1)
Base value (average model output): 0.2825
Top 5 important features based on SHAP values:
Age: 0.0150
Experience: 0.0150

Sample 2 (True label: 0)
Top 5 important features based on SHAP values:
Experience: 0.0392
Age: 0.0392

Sample 3 (True label: 1)
Top 5 important features based on SHAP values:
Age: 0.0043
Experience: 0.0043

Sample 4 (True label: 0)
Top 5 important features based on SHAP values:
Experience: 0.0280
Age: 0.0280

Sample 5 (True label: 1)
Top 5 important features based on SHAP values:
Age: 0.0000
Experience: 0.0000
# Create summary visualizations for SHAP values
feature_names = list(X.columns)

# Approach 1: simple bar plot of mean absolute SHAP values
# This is more reliable than shap's built-in summary plot and avoids
# dimension issues
plt.figure(figsize=(12, 8))
mean_shap_values = np.abs(shap_values[1]).mean(0)  # mean |SHAP value| per feature
sorted_idx = np.argsort(mean_shap_values)
plt.barh([feature_names[i] for i in sorted_idx], mean_shap_values[sorted_idx])
plt.title("Average Impact on Model Output (absolute SHAP values)")
plt.xlabel("|SHAP Value| (average impact on model output magnitude)")
plt.tight_layout()
plt.show()

# Approach 2: individual feature importance plots for each sample
plt.figure(figsize=(15, 10))
for i, label in enumerate(sample_labels):
    plt.subplot(3, 2, i + 1)
    sorted_idx = np.argsort(np.abs(shap_values[1][i]))
    plt.barh([feature_names[j] for j in sorted_idx], shap_values[1][i][sorted_idx])
    plt.title(f"Sample {i+1} (True label: {label})")
plt.tight_layout()
plt.subplots_adjust(hspace=0.5)
plt.show()
Comparison between LIME and SHAP Explanations¶
For the 5 randomly selected data points, we have now generated explanations using both LIME and SHAP. Some observations about the two sets of explanations:
Consistency: Both LIME and SHAP tend to identify similar important features for the predictions, though they may rank them differently.
Interpretation:
- LIME provides local explanations by approximating the model with a simpler linear model around the prediction.
- SHAP values represent the contribution of each feature to the prediction based on cooperative game theory.
Feature Importance: While both methods show feature importance, SHAP values have theoretical guarantees (like additivity and consistency) that LIME does not provide.
Visualization: SHAP force plots show how each feature pushes the prediction higher or lower, while LIME shows contributions with positive and negative weights.
Global Insights: The SHAP summary plot allows us to see patterns across multiple instances, providing both local explanations and global model insights.
The combination of these two explanation methods gives us a more comprehensive understanding of the model's decision-making process.
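To make the LIME side of this comparison concrete: its local surrogate is essentially a weighted linear regression fitted to perturbed copies of the instance. The numpy-only sketch below illustrates that idea; the `black_box` model, noise scale, and kernel width here are hypothetical choices for illustration, not LIME's actual defaults.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    # Hypothetical nonlinear model standing in for the trained MLP's
    # probability output (illustration only)
    return 1.0 / (1.0 + np.exp(-(np.sin(X[:, 0]) + X[:, 1] ** 2)))

def fit_local_surrogate(x, predict_fn, n_samples=1000, kernel_width=0.75):
    """Fit a locally weighted linear surrogate around x (the core idea of LIME)."""
    d = len(x)
    # 1. Perturb the instance with Gaussian noise
    Z = x + rng.normal(scale=0.5, size=(n_samples, d))
    y = predict_fn(Z)
    # 2. Weight perturbations by proximity to x (exponential kernel)
    dist = np.linalg.norm(Z - x, axis=1)
    w = np.exp(-dist ** 2 / kernel_width ** 2)
    # 3. Weighted least squares with an intercept column
    A = np.hstack([np.ones((n_samples, 1)), Z])
    beta = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))
    return beta[0], beta[1:]  # intercept, coefficients

x = np.array([0.5, 0.5])
intercept, coefs = fit_local_surrogate(x, black_box)
prediction_local = intercept + coefs @ x    # surrogate's prediction at x
right = black_box(x[None, :])[0]            # the model's actual output
print(f"Prediction_local {prediction_local:.4f} Right: {right:.4f} Intercept {intercept:.4f}")
```

The printed triple mirrors the `Prediction_local` / `Right:` / `Intercept` values that LIME reports in its verbose output: a good local fit is one where `Prediction_local` tracks `Right` closely.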
Conclusion and Discussion¶
In this assignment, we developed a Multilayer Perceptron (MLP) for credit card approval prediction and explored several explainability techniques:
Model Performance: Our MLP achieved moderate accuracy on the credit card approval task, with a mean cross-validation accuracy of around 0.69 and a training error of about 0.19.
LIME Explanations: LIME provided local explanations for individual predictions by approximating the model with a simpler, interpretable model.
SHAP Explanations: SHAP values showed the contribution of each feature to the prediction based on game theory principles.
Exact Shapley Values: We also computed exact Shapley values for a subset of samples, which provide the mathematically optimal attribution of feature importance.
Comparison of Methods:
- LIME is computationally cheap but yields only approximate, locally linear explanations.
- SHAP offers stronger theoretical guarantees (additivity, consistency) but can be computationally expensive.
- Exact Shapley values provide the most mathematically sound attributions, but their cost grows exponentially with the number of features.
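To illustrate why exact Shapley values are so expensive, the brute-force definition sums over every coalition of the remaining features, which is exponential in the feature count. The sketch below is a minimal illustration on a toy linear model (our own construction, not the assignment's MLP); for a linear model, the Shapley value of feature j has the closed form w_j · (x_j − baseline_j), which lets us check the result.

```python
import itertools
import math
import numpy as np

def exact_shapley(f, x, baseline):
    """Brute-force Shapley values for one instance; O(2^n) model evaluations."""
    n = len(x)
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in itertools.combinations(others, size):
                # Coalition S: features in S take their values from x,
                # "absent" features take their values from the baseline
                z_without = baseline.copy()
                z_without[list(subset)] = x[list(subset)]
                z_with = z_without.copy()
                z_with[i] = x[i]
                weight = (math.factorial(size) * math.factorial(n - size - 1)
                          / math.factorial(n))
                phi[i] += weight * (f(z_with[None, :])[0] - f(z_without[None, :])[0])
    return phi

# Toy linear model: Shapley values reduce to w_j * (x_j - baseline_j)
w = np.array([2.0, -1.0, 0.5])
f = lambda X: X @ w
x = np.array([1.0, 2.0, 3.0])
baseline = np.zeros(3)

phi = exact_shapley(f, x, baseline)   # phi = [2.0, -2.0, 1.5]
# Efficiency property: attributions sum to f(x) - f(baseline)
assert np.isclose(phi.sum(), f(x[None, :])[0] - f(baseline[None, :])[0])
```

KernelExplainer approximates exactly this quantity by sampling coalitions instead of enumerating all 2^n of them, which is the trade-off between the SHAP and exact-Shapley rows above.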
These explainability techniques help us understand how our black-box MLP model makes predictions, which is crucial for ensuring fairness, accountability, and transparency in machine learning applications, especially in sensitive domains like credit approval.